Approximate Message Passing algorithms for rotationally invariant matrices
Authors
Abstract
Approximate Message Passing (AMP) algorithms have seen widespread use across a variety of applications. However, the precise forms of their Onsager corrections and state evolutions depend on the properties of the underlying random matrix ensemble, limiting the extent to which AMP algorithms derived for white noise may be applicable to data matrices that arise in practice. In this work, we study more general AMP algorithms for random matrices W that satisfy orthogonal rotational invariance in law, where the spectral distribution of W may differ from the semicircle and Marchenko–Pastur laws characteristic of white noise. The Onsager corrections and state evolutions in these algorithms are defined by the free cumulants or rectangular free cumulants of the spectral distribution of W. Their forms were derived previously by Opper, Çakmak and Winther using non-rigorous dynamic functional theory techniques, and we provide rigorous proofs. Our motivating application is a Bayes-AMP algorithm for Principal Components Analysis (PCA), when there is prior structure for the principal components (PCs) and possibly non-white noise. For sufficiently large signal strengths and any non-Gaussian prior distributions for the PCs, we show that this algorithm provably achieves higher estimation accuracy than the sample PCs.
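To make the role of the Onsager correction concrete, here is a minimal sketch of a symmetric AMP iteration for a rank-one spiked Wigner model in the white-noise special case that the abstract contrasts with. All names, problem sizes, and the tanh denoiser scaling are illustrative assumptions, not the paper's construction; for rotationally invariant W with a non-semicircle spectrum, the paper replaces the scalar Onsager coefficient below with a series defined by the free cumulants of the spectral distribution, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Rank-one spiked Wigner model: Y = (lam / n) * v v^T + W, with W a
# normalized symmetric Gaussian (semicircle-law) noise matrix.
lam = 3.0
v = np.sign(rng.standard_normal(n))          # +/-1 prior for the PC (illustrative)
G = rng.standard_normal((n, n)) / np.sqrt(n)
W = (G + G.T) / np.sqrt(2)
Y = (lam / n) * np.outer(v, v) + W

def g(x):
    # Denoiser matched to a +/-1 prior in Gaussian noise is a tanh;
    # the scaling lam inside is a heuristic choice for this sketch.
    return np.tanh(lam * x)

x_prev = np.zeros(n)
x = rng.standard_normal(n)                   # random (uninformative) initialization
for _ in range(20):
    m = g(x)
    # Scalar Onsager coefficient for the white-noise case: empirical mean of g'(x).
    onsager = np.mean(lam * (1.0 - np.tanh(lam * x) ** 2))
    x_new = Y @ m - onsager * g(x_prev)      # memory term cancels iterate correlations
    x_prev, x = x, x_new

# Normalized overlap between the AMP estimate and the planted PC.
overlap = abs(np.dot(g(x), v)) / (np.linalg.norm(g(x)) * np.linalg.norm(v))
print(round(overlap, 2))
```

At this signal strength (well above the spectral threshold), the overlap with the planted component is close to 1, illustrating the accuracy gain over an uninformative start; the paper's contribution is proving the analogous state evolution when W is only rotationally invariant.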
Similar resources
Approximate Message Passing Algorithms for Generalized Bilinear Inference
Recent developments in compressive sensing (CS) combined with increasing demands for effective high-dimensional inference techniques across a variety of disciplines have motivated extensive research into algorithms exploiting various notions of parsimony, including sparsity and low-rank constraints. In this dissertation, we extend the generalized approximate message passing (GAMP) approach, ori...
Approximate Message Passing
In this note, I summarize Sections 5.1 and 5.2 of Arian Maleki's PhD thesis. 1 Notation We denote scalars by lowercase letters, e.g. a, b, c, ..., vectors by boldface lowercase letters, e.g. λ, α, x, ..., matrices by boldface capital letters, e.g. A, B, C, ..., and (subsets of) natural numbers by capital letters, e.g. N, M, .... We denote the i-th element of a vector a by a_i and the (i, j)-th entry of a matrix A by ...
Parameterless Optimal Approximate Message Passing
Iterative thresholding algorithms are well-suited for high-dimensional problems in sparse recovery and compressive sensing. The performance of this class of algorithms depends heavily on the tuning of certain threshold parameters. In particular, both the final reconstruction error and the convergence rate of the algorithm crucially rely on how the threshold parameter is set at each step of the ...
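The thresholded iteration this abstract refers to can be sketched as follows: a minimal AMP variant of iterative soft-thresholding for noiseless compressed sensing. The problem sizes, the threshold multiple `alpha`, and the per-iteration noise estimate are all illustrative assumptions; choosing `alpha` well is exactly the tuning question the abstract addresses.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N, k = 250, 500, 20        # measurements, signal dimension, sparsity (illustrative)

# Gaussian sensing matrix with (approximately) unit-norm columns.
A = rng.standard_normal((n, N)) / np.sqrt(n)
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = 3.0 * rng.standard_normal(k)
y = A @ x_true                # noiseless measurements

def soft(u, t):
    # Soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(N)
z = y.copy()
alpha = 1.5                   # threshold multiple of the noise level (a tuning choice)
for _ in range(30):
    tau = np.linalg.norm(z) / np.sqrt(n)          # empirical effective-noise estimate
    x_new = soft(x + A.T @ z, alpha * tau)        # denoise the pseudo-data
    # Onsager term: residual times the empirical mean of the threshold's derivative.
    z = y - A @ x_new + (z / n) * np.count_nonzero(x_new)
    x = x_new

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(round(rel_err, 3))
```

In this regime the reconstruction error decays rapidly across iterations; a poorly chosen `alpha` would instead stall the recursion, which motivates the parameterless tuning studied in the abstract above.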
Bilinear Generalized Approximate Message Passing
We extend the generalized approximate message passing (G-AMP) approach, originally proposed for highdimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. In the first part of the paper, we derive our Bilinear...
Swept Approximate Message Passing for Sparse Estimation
Approximate Message Passing (AMP) has been shown to be a superior method for inference problems, such as the recovery of signals from sets of noisy, lower-dimensionality measurements, both in terms of reconstruction accuracy and in computational efficiency. However, AMP suffers from serious convergence issues in contexts that do not exactly match its assumptions. We propose a new approach to st...
Journal
Journal title: Annals of Statistics
Year: 2022
ISSN: 0090-5364, 2168-8966
DOI: https://doi.org/10.1214/21-aos2101